The Interplay between Entropy and Variational Distance Part I: Basic Concepts and Bounds
Authors
Abstract
For two probability distributions on finite alphabets, a small variational distance between them does not imply that the difference between their entropies is small if one of the alphabet sizes is unknown. This fact, which seems to contradict the continuity of entropy on finite alphabets, is clarified in the current paper by means of bounds on the entropy difference between two probability distributions in terms of the variational distance between them and their alphabet sizes. These bounds are shown to be the tightest possible. The method of Lagrange multipliers cannot be applied here because the variational distance is not differentiable. We also show how to find the distribution achieving the minimum (or maximum) entropy among all distributions within a given variational distance from a given distribution.
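A minimal numerical sketch (my own illustration, not the extremal construction derived in the paper) of the first claim in the abstract: moving a small amount of probability mass from a point mass onto n fresh symbols keeps the variational distance fixed at 2*eps, while the entropy gap grows roughly like eps*log(n) as n grows. Only numpy is assumed.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability symbols."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def variational_distance(p, q):
    """Variational distance V(P, Q) = sum_i |p_i - q_i|."""
    return float(np.abs(np.asarray(p) - np.asarray(q)).sum())

eps = 0.01  # probability mass moved away from the point mass
for n in [10, 10**3, 10**6]:
    p = np.zeros(n + 1); p[0] = 1.0                 # P: point mass on one symbol
    q = np.full(n + 1, eps / n); q[0] = 1.0 - eps   # Q: eps spread over n extra symbols
    print(f"n={n:>8}  V(P,Q)={variational_distance(p, q):.3f}  "
          f"|H(P)-H(Q)|={abs(entropy(p) - entropy(q)):.3f} bits")

# V(P,Q) stays at 2*eps = 0.02 for every n, while |H(P)-H(Q)| grows without
# bound with the alphabet size, which is the phenomenon the abstract describes.
```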
Similar Resources
Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Network Error Correction, Part I: Basic Concepts and Upper Bounds
Error correction in existing point-to-point communication networks is done on a link-by-link basis, which is referred to in this paper as classical error correction. Inspired by network coding, we introduce in this two-part paper a new paradigm called network error correction. The theory thus developed subsumes classical algebraic coding theory as a special case. In Part I, we discuss the basic...
Sharp Upper bounds for Multiplicative Version of Degree Distance and Multiplicative Version of Gutman Index of Some Products of Graphs
The degree distance of a graph was introduced by Dobrynin, Kochetova and Gutman in 1994, and Gutman proposed the Gutman index of a graph in the same year. In this paper, we introduce the concepts of the multiplicative version of degree distance and the multiplicative version of the Gutman index of a graph. We find the sharp upper bound for the multiplicative version of degree distance and multiplicative ver...
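For reference, the sketch below computes the two classical invariants the excerpt builds on: the degree distance DD(G) = sum over unordered pairs {u,v} of (deg u + deg v)*d(u,v), and the Gutman index Gut(G) = sum over {u,v} of deg(u)*deg(v)*d(u,v). The multiplicative versions introduced in the cited paper are not defined in the excerpt, so they are not reproduced here; networkx is an assumed dependency.

```python
import itertools
import networkx as nx

def degree_distance_and_gutman(G):
    """Return (degree distance, Gutman index) of a connected graph G."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    deg = dict(G.degree())
    dd = gut = 0
    for u, v in itertools.combinations(G.nodes(), 2):  # unordered vertex pairs
        d_uv = dist[u][v]
        dd += (deg[u] + deg[v]) * d_uv
        gut += deg[u] * deg[v] * d_uv
    return dd, gut

# Example: the path on 4 vertices.
P4 = nx.path_graph(4)
print(degree_distance_and_gutman(P4))  # prints (28, 19)
```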
On Complementary Distance Signless Laplacian Spectral Radius and Energy of Graphs
Let $D$ be the diameter of a connected graph $G$ and $d_G(v_i, v_j)$ the distance between its vertices $v_i$ and $v_j$. The complementary distance signless Laplacian matrix of $G$ is $CDL^+(G)=[c_{ij}]$, where $c_{ij}=1+D-d_G(v_i, v_j)$ if $i \neq j$ and $c_{ii}=\sum_{j=1}^{n}(1+D-d_G(v_i, v_j))$. The complementary transmission $CT_G(v)$ of a vertex $v$ is defined as $CT_G(v)=\sum_{u \in ...
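A small computational sketch of the matrix as stated in the excerpt: it fills in $CDL^+(G)$ entry by entry (note that the diagonal sum, read literally over all $j$, includes a $1+D$ contribution from $j=i$; the paper's own convention may differ) and reports its spectral radius for a 4-cycle. networkx and numpy are assumed.

```python
import networkx as nx
import numpy as np

def cdl_plus(G):
    """Complementary distance signless Laplacian, entries as in the excerpt."""
    nodes = list(G.nodes())
    dist = dict(nx.all_pairs_shortest_path_length(G))
    D = max(dist[u][v] for u in nodes for v in nodes)   # diameter of G
    n = len(nodes)
    C = np.zeros((n, n))
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if i != j:
                C[i, j] = 1 + D - dist[u][v]
        C[i, i] = sum(1 + D - dist[u][w] for w in nodes)  # diagonal as stated
    return C

# Example: the cycle C4; the spectral radius is the largest eigenvalue.
M = cdl_plus(nx.cycle_graph(4))
print(np.max(np.linalg.eigvalsh(M)))  # M is symmetric, so eigvalsh applies
```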
IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES: An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method
The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variat...
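As a rough numerical illustration of the approximation discussed above (not of the Chen-Stein bounds themselves): for i.i.d. Bernoulli(p) summands the sum is Binomial(n, p), so its entropy can be compared directly with the entropy of a Poisson variable of the same mean n*p. scipy is an assumed dependency, and the parameter values and truncation point below are arbitrary choices.

```python
import numpy as np
from scipy.stats import binom, poisson

def entropy_pmf(pmf):
    """Shannon entropy in nats of a discrete pmf given as an array."""
    pmf = pmf[pmf > 0]
    return float(-(pmf * np.log(pmf)).sum())

n, p = 1000, 0.005          # many summands, small success probability
lam = n * p                 # matching mean for the Poisson approximation
kmax = 200                  # truncation point for the rapidly decaying Poisson tail

H_binom = entropy_pmf(binom.pmf(np.arange(n + 1), n, p))
H_poisson = entropy_pmf(poisson.pmf(np.arange(kmax + 1), lam))
print(f"H(Binomial({n},{p})) = {H_binom:.6f} nats")
print(f"H(Poisson({lam}))    = {H_poisson:.6f} nats")
print(f"difference           = {abs(H_binom - H_poisson):.2e} nats")
```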